Summary

Top Articles:

  • Skip the Surveillance By Opting Out of Face Recognition At Airports
  • Amazon Ring Must End Its Dangerous Partnerships With Police

Amazon Ring Must End Its Dangerous Partnerships With Police

Published: 2020-06-10 21:12:09

Popularity: 2567

Author: Jason Kelley

Keywords:

  • Call To Action
  • Privacy
  • Digital Rights and the Black-led Movement Against Police Violence
  • Street-Level Surveillance
Across the United States, people are taking to the streets to protest racist police violence, including the tragic police killings of George Floyd and Breonna Taylor. This is a historic moment of reckoning for law enforcement. Technology companies, too, must rethink how the tools they design and sell to police departments minimize accountability and exacerbate injustice. Even worse, some companies profit directly from exploiting irrational fears of crime that all too often feed the flames of police brutality. So we're calling on Amazon Ring, one of the worst offenders, to immediately end the partnerships it holds with over 1,300 law enforcement agencies.

One by one, companies that profit off fears of crime have released statements voicing solidarity with the communities that are disproportionately impacted by police violence. Amazon, which owns Ring, announced that it "stand[s] in solidarity with the Black community—[its] employees, customers, and partners—in the fight against systemic racism and injustice."

And yet, Amazon and other companies offer a high-speed digital mechanism by which people can make snap judgments about who does, and who does not, belong in their neighborhood, and summon police to confront them. This mechanism also facilitates police access to video and audio footage from massive numbers of doorbell cameras aimed at the public way across the country—a feature that could conceivably be used to identify participants in a protest moving through a neighborhood. Amazon built this surveillance infrastructure through tight-knit partnerships with police departments, including officers hawking Ring's cameras to residents, and Ring telling officers how to better pressure residents to share their videos.
Despite Amazon's statement that "the inequitable and brutal treatment of Black people in our country must stop," Ring plays an active role in enabling and perpetuating police harassment of Black Americans. Ring's surveillance doorbells and its accompanying Neighbors app have inflamed many residents' worst instincts and urged them to spy on pedestrians, neighbors, and workers. We must tell Amazon Ring to end its police partnerships today.

Ring Threatens Privacy and Communities

We've written extensively about why Ring is a "Perfect Storm of Privacy Threats," and we've laid out five specific problems with Ring-police partnerships. We also revealed a number of previously undisclosed trackers sending information from the Ring app to third parties, and critiqued the lackluster changes made in response to security flaws.

To start, Ring sends notifications to a person's phone every time the doorbell rings or motion near the door is detected. With every notification, Ring turns the pizza delivery person or census-taker innocently standing at the door into a potential criminal. And with the click of a button, Ring allows a user to post video taken from that camera directly to their community, facilitating the reporting of so-called "suspicious" behavior. This encourages racial profiling—take, for example, an African-American real estate agent who was stopped by police because neighbors thought it was "suspicious" for him to ring a doorbell.

Ring Could Be Used to Identify Protesters

To make matters worse, Ring's continued growth of police partnerships during the current protests makes an arrangement already at risk of enabling racial profiling even more troubling and dangerous. Ring now has relationships with over 1,300 police departments around the United States.
These partnerships give police a general idea of the location of every Ring camera in town, and allow them to make batch requests for footage via email to every resident with a camera within an area of interest—potentially giving police a one-step process for requesting footage of protests in order to identify protesters. In some towns, the local government has even offered tiered discounts on the cameras based on how much of the public area on a street a Ring camera will regularly capture: the more public space it captures, the larger the discount.

If a Ring camera captures demonstrations, the owner risks making protesters identifiable to police and vulnerable to retribution. Even if the camera owner refuses to voluntarily share footage of a protest with police, law enforcement can go straight to Amazon with a warrant and thereby circumvent the camera's owner.

Ring Undermines Public Trust in Police

The rapid proliferation of these partnerships between police departments and the Ring surveillance system—without oversight, transparency, or restrictions—poses a grave threat to the privacy and safety of everyone in the community. "Fear sells," Ring posted on its company blog in 2016. Fear also gets people hurt, by inflaming tensions and creating suspicion where none rationally exists.

Consider that Amazon also encourages police to tell residents to install the Ring app and purchase cameras for their homes, an arrangement that makes salespeople out of what should be impartial and trusted protectors of our civic society. Per Motherboard, for every town resident who downloads Ring's Neighbors app, the local police department gets credits toward buying cameras it can distribute to residents. This troubling relationship is worse than uncouth: it is unsafe and diminishes public trust. Some of the "features" Ring has been considering adding would considerably increase the danger it poses.
Integrated face recognition software would enable the worst kind of privacy invasion, potentially forcing every person who approaches a Ring doorbell to have their face scanned and cross-checked against a database of other faces without their consent. License plate scanning could match people's faces to their cars. Alerting users to local 911 calls as part of the "crime news" alerts in the Neighbors app would instill even more fear, and probably sell additional Ring services.

Just today, Amazon announced a one-year moratorium on police use of its dangerous "Rekognition" face recognition tool. This follows an announcement from IBM that it will no longer develop or research face recognition technology, in part because of its use in mass surveillance, policing, and racial profiling. We're glad Amazon has admitted that the unregulated use of face recognition can do harm to vulnerable communities. Now it's time for Amazon to admit the dangers of Ring-police partnerships, and stand behind its statement on police brutality.


    Skip the Surveillance By Opting Out of Face Recognition At Airports

    Published: 2019-04-25 04:38:56

    Popularity: 6724

    Author: Jason Kelley

    Keywords:

  • Commentary
  • Biometrics
Government agencies and airlines have ignored years of warnings from privacy groups and Senators that using face recognition technology on travelers would massively violate their privacy. Now the passengers are in revolt as well, and they're demanding answers. Last week, a lengthy exchange on Twitter between a traveler who was concerned about her privacy and a spokesperson for the airline JetBlue went viral, and many of the questions asked by the traveler and others were the same ones we've posed to Customs and Border Protection (CBP) officials: Where did you get my data? How is it protected? Which airports will use this? Where in the airports will it be used? Most importantly, how do I opt out?

How to Opt Out

These questions should be simple to answer, but we haven't gotten simple answers. When we asked CBP for more information, they told us: "visit our website." We did, and we still have many of the same questions. Representatives for airlines, which partner directly with the government agencies, also seem unable to answer the concerns, as the JetBlue spokesperson made evident. Both agencies and airlines seemed to expect no pushback from passengers when they implemented this boarding-pass-replacing panopticon. The convenience would win out, they seemed to assume, not expecting people to mind having their face scanned "the same way you unlock your phone." But now that "your face is your boarding pass" (as JetBlue awkwardly puts it), at least in some airports, the invasive nature of the system is much clearer, and travelers are understandably upset.

It might sound trite, but right now, the key to opting out of face recognition is to be vigilant. There's no single box you can check, and importantly, it may not be possible for non-U.S. persons to opt out of face recognition entirely. For those who can opt out, you'll need to spot the surveillance when it's happening.
To start, TSA PreCheck, Clear, and other ways of "skipping the line" often require biometric identification, and are often being used as test cases for these sorts of programs. Once you're at the airport, be on the lookout any time a TSA, CBP, or airline employee asks you to look into a device, or when you see a dedicated kiosk or signage announcing face recognition: that means your biometric data is probably about to be scanned. At the moment, face recognition is most likely to happen at specific airports, including Atlanta, Chicago, Seattle, San Francisco, Las Vegas, Los Angeles, Washington (Dulles and Reagan), Boston, Fort Lauderdale, Houston Hobby, Dallas/Fort Worth, JFK, Miami, San Jose, Orlando, and Detroit; while flying on Delta, JetBlue, Lufthansa, British Airways, and American Airlines; and in particular, on international flights. But that doesn't mean other airlines and airports won't implement it sooner rather than later.

To skip the surveillance, CBP says you "should notify a CBP Officer or an airline or airport representative in order to seek an alternative means of verifying [your] identity and documents." Do the same when you encounter this with an airline. While there should be signage near the face recognition area, it may not be clear.

If you're concerned about creating a slight delay for yourself or other passengers, take note: though CBP has claimed a 98% accuracy rate in its pilot programs, the Office of the Inspector General could not verify those numbers, and even a 2% error rate would cause thousands of people to be misidentified every day. Most face recognition technology has significantly lower accuracy ratings than that, so you might actually be speeding things up by skipping the surveillance.
The Long and Winding Biometric Pathway

Part of the reason for the confusion about how to opt out is that there are actually (at least) three different face recognition checkpoints looming. Airlines want to use your face as your boarding pass, saying "it's about convenience." CBP, which is part of the Department of Homeland Security (DHS), wants to use your face to check against DHS and State Department databases when you're entering or exiting the country. And the TSA wants to compare your face against your photo identification throughout the airport.

If people are upset now, they will be furious to know this is just the beginning of the "biometric pathway" program: CBP and TSA want to use face recognition and other biometric data to track everyone from check-in, through security, into airport lounges, and onto flights (PDF). They're moving fast, too, despite (or perhaps because of) the fact that there are no regulations on this sort of technology: DHS is hoping to use face recognition on 97 percent of departing air passengers within the next four years, and on 100 percent of all international passengers in the top 20 U.S. airports by 2021.

If the government agencies get their way, new biometric data could be taken from, and used against, travelers wherever they are in the airport—and much of that collection will be implemented by private companies (even rental car companies are getting in on the action). CBP will store that face recognition data for two weeks for U.S. citizens and lawful permanent residents, and for 75 or more years for non-U.S. persons.
In addition, the biometric data collected by at least some of these systems in the future—which can include your fingerprints, the image of your face, and the scan of your iris—will be stored in FBI and DHS databases and will be searched again and again for immigration, law enforcement, and intelligence checks, including checks against latent prints associated with unsolved crimes.

Passengers Will Bear the Burden of Privacy Invasion, Not Airlines or Government Agencies

It's easy for companies and agencies to tout the convenience of this sort of massive data collection and sharing scheme. But as we've seen in notable privacy fiascos over the last few years—from Facebook's Cambridge Analytica scandal, to the breaches of the Office of Personnel Management and Equifax in the U.S., to the constant hacking of India's national biometric database, Aadhaar—it's the customers and passengers who will bear the burden when things go wrong. And they will go wrong. These vast biometric databases create huge security and privacy risks, with the added concern that a company leaking your passwords or credit card numbers is nothing compared to it leaking your biometric data. While you can change a password, you can't easily change your face.

Additionally, these systems are notoriously inaccurate and contain out-of-date information. Because immigrants and people of color are disproportionately represented in criminal and immigration databases, and because face recognition systems are less capable of identifying people of color, women, and young people, the weight of these inaccuracies will fall disproportionately on them. It will be the passengers who bear the burden when they are stuck watching the flights they paid for take off without them because there was an error with a database or an algorithm, or because they preferred non-biometric options that weren't in place.
It's time for the government agencies and the airlines to pause these programs until they can clearly and adequately provide:

  • Photographs of the signage in situ in the airports in question, as well as any additional information about the opt-out process.
  • An explanation of the locations where CBP will be providing meaningful and clear opt-out notice to travelers (for example, at entry points, point-of-sale, ticket counters, security checkpoints, and boarding gates), as well as the specific language travelers can use to opt out of the biometric data collection program.
  • An up-to-date list of all the airports and airlines that currently participate in the biometric exit program.
  • Information about the algorithm CBP is using to compare photos (provided by NEC), as well as the accuracy information associated with that algorithm.
  • Technological specifications for transferring data from the point of collection to DHS, and to vendors and airlines.

Additional questions—like how data is safeguarded—are laid out in our letter to CBP. Congress must also demand answers to these questions. And lawmakers must require agencies and airlines to pause this program until they can not only ensure that the biometric privacy of travelers is protected but, more importantly, justify this huge invasion of privacy. Just last month, three Senators released a joint statement calling on DHS to pause the program until there can be "a rulemaking to establish privacy and security rules of the road," but so far, they've been ignored.

Trading privacy for convenience is a bad bargain, and it can feel like the deal isn't one we have a choice in. DHS has said that the only way we can ensure that our biometric data isn't collected when we travel is to "refrain from traveling." That's ridiculous. The time to regulate and restrict the use of face recognition technology is now, before it becomes embedded in our everyday lives.
We must keep fighting to make sure that in the future, it gets easier, and not harder, to defend our privacy—biometric or otherwise.

